Expectation Consistent Approximate Inference
Authors
Abstract
We propose a novel framework for deriving approximations of intractable probabilistic models. The framework is based on a free energy (negative log marginal likelihood) and can be seen as a generalization of adaptive TAP [1-3] and expectation propagation (EP) [4,5]. The free energy is constructed from two approximating distributions which encode different aspects of the intractable model, such as single-node constraints and couplings, and which are by construction consistent on a chosen set of moments. We test the framework on a difficult benchmark problem: binary variables on fully connected graphs and on 2D grid graphs. We find good performance using sets of moments which either specify factorized nodes or a spanning tree on the nodes (structured approximation). Surprisingly, the Bethe approximation gives markedly inferior results even on grids.
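To make the construction concrete, the following is a schematic sketch of the EC free energy in our own notation; the split into f_q (single-node terms) and f_r (coupling terms), and the moment functions g, are illustrative assumptions, not quoted from the paper.

```latex
% Schematic EC approximation to the log partition function \ln Z.
% f_q carries the single-node constraints, f_r the couplings; both
% are tilted by the same moment functions g(x).
\ln Z_{\mathrm{EC}}(\lambda_q, \lambda_r)
  = \ln \int f_q(x)\, e^{\lambda_q^\top g(x)}\, dx
  + \ln \int f_r(x)\, e^{\lambda_r^\top g(x)}\, dx
  - \ln \int e^{(\lambda_q + \lambda_r)^\top g(x)}\, dx
% Stationarity in \lambda_q and \lambda_r forces the expectations of
% g(x) under all three distributions to coincide -- the "expectation
% consistency" of the title.
```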
Similar resources
Approximate inference techniques with expectation constraints
This article discusses inference problems in probabilistic graphical models that often occur in a machine learning setting. In particular, it presents a unified view of several recently proposed approximation schemes. Expectation consistent approximations and expectation propagation are both shown to be related to Bethe free energies with weak consistency constraints, i.e. free energies where lo...
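As a sketch of what weak consistency means here (notation ours): cluster beliefs b_alpha and node beliefs b_i are required to agree only on expectations of chosen statistics g_i, rather than on full marginals as in the standard Bethe construction.

```latex
% Weak consistency: moment matching between cluster and node beliefs
\langle g_i(x_i) \rangle_{b_\alpha} = \langle g_i(x_i) \rangle_{b_i}
  \quad \text{for all } i \in \alpha ,
% versus the strong (marginalization) constraint of ordinary Bethe/BP:
% \sum_{x_{\alpha \setminus i}} b_\alpha(x_\alpha) = b_i(x_i).
```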
An Approximate Inference Approach for the PCA Reconstruction Error
The problem of computing a resample estimate for the reconstruction error in PCA is reformulated as an inference problem with the help of the replica method. Using the expectation consistent (EC) approximation, the intractable inference problem can be solved efficiently with two variational parameters. A perturbative correction to the result is computed and an alternative simplified derivation ...
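For orientation, the target quantity can be written down directly. The minimal Python sketch below computes a held-out PCA reconstruction error by brute force; the toy data, the train/test split, and the use of NumPy are our assumptions, and the point of the paper is precisely to estimate such resample errors without this explicit recomputation.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))   # toy data: 200 samples in R^10
X -= X.mean(axis=0)                  # center before PCA

k = 3                                # number of retained components
train, test = X[:150], X[150:]       # one naive resample split

# PCA via SVD of the training block
_, _, Vt = np.linalg.svd(train, full_matrices=False)
P = Vt[:k].T @ Vt[:k]                # projector onto the top-k subspace

# mean squared reconstruction error on the held-out block
err = np.mean(np.sum((test - test @ P) ** 2, axis=1))
print(f"held-out reconstruction error: {err:.3f}")
```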
Assessing Approximate Inference for Binary Gaussian Process Classification
Gaussian process priors can be used to define flexible, probabilistic classification models. Unfortunately, exact Bayesian inference is analytically intractable, and various approximation techniques have been proposed. In this work we review and compare Laplace’s method and Expectation Propagation for approximate Bayesian inference in the binary Gaussian process classification model. We present a...
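As a concrete point of reference, scikit-learn's GaussianProcessClassifier implements the Laplace approximation discussed here (it does not implement EP); a minimal sketch with toy data of our own:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))        # toy 1-D inputs
y = (np.sin(X[:, 0]) > 0).astype(int)        # binary labels from sign(sin x)

# Binary GP classification; the posterior over the latent function is
# approximated internally with Laplace's method.
gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0))
gpc.fit(X, y)
print(gpc.predict_proba(np.array([[0.5], [-0.5]])))  # class probabilities
```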
Improving on Expectation Propagation
A series of corrections is developed for the fixed points of Expectation Propagation (EP), which is one of the most popular methods for approximate probabilistic inference. These corrections can lead to improvements of the inference approximation or serve as a sanity check, indicating when EP yields unreliable results.
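The object being corrected is the EP fixed point, which schematically (our notation) couples a cavity step and a moment-matching step:

```latex
% EP approximates each intractable factor f_i by an exponential-family
% term t_i, giving q(x) \propto \prod_i t_i(x). At a fixed point, for
% each i, removing t_i and reinstating f_i leaves the moments unchanged:
q_{\setminus i}(x) \propto \frac{q(x)}{t_i(x)}, \qquad
\mathbb{E}_{q}[g(x)] = \mathbb{E}_{\,q_{\setminus i}\, f_i / Z_i}[g(x)] .
% The corrections expand the exact partition function around such a
% fixed point, schematically Z = Z_{EP}(1 + R).
```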
Iterative Refinement of Approximate Posterior for Training Directed Belief Networks
Deep directed graphical models, while a potentially powerful class of generative representations, are challenging to train due to difficult inference. Recent advances in variational inference that make use of an inference or recognition network have advanced well beyond traditional variational inference and Markov chain Monte Carlo methods. While these techniques offer higher flexibility as wel...
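The underlying objective is the standard variational lower bound (ELBO), with the approximate posterior produced by the recognition network; iterative refinement then improves it beyond a single feed-forward pass (notation ours):

```latex
% Evidence lower bound maximized jointly over generative parameters
% \theta and recognition-network parameters \phi:
\log p_\theta(x) \;\ge\;
  \mathbb{E}_{q_\phi(z \mid x)}
    \big[ \log p_\theta(x, z) - \log q_\phi(z \mid x) \big] .
% The bound is tight exactly when q_\phi(z|x) = p_\theta(z|x), which
% motivates refining q_\phi(z|x) iteratively rather than trusting one
% forward pass of the recognition network.
```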
On the Concentration of Expectation and Approximate Inference in Layered Networks
We present an analysis of concentration-of-expectation phenomena in layered Bayesian networks that use generalized linear models as the local conditional probabilities. This framework encompasses a wide variety of probability distributions, including both discrete and continuous random variables. We utilize ideas from large deviation analysis and the delta method to devise and evaluate a class ...
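A generalized linear model as the local conditional means each node is an exponential-family variable whose natural parameter is linear in its parents; in our notation:

```latex
% GLM local conditional for node x_i with parent set pa(i):
p(x_i \mid x_{\mathrm{pa}(i)})
  = \exp\!\big( x_i\, \eta_i - A(\eta_i) + c(x_i) \big),
\qquad \eta_i = w_i^\top x_{\mathrm{pa}(i)} + b_i .
% Concentration of expectation: in wide layers \eta_i is a sum of many
% weakly dependent terms, so it concentrates around its mean -- the
% property the large-deviation / delta-method analysis exploits.
```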
Journal title: Journal of Machine Learning Research
Volume: 6
Pages: -
Publication date: 2005